Combining Neural Network Voting Classifiers and Error Correcting Output Codes

Author

  • Kurt Hornik
Abstract

We show that error correcting output codes (ECOC) can further improve the effects of error dependent adaptive resampling methods such as arc-lh. In traditional one-in-n coding, the distance between two binary class labels is rather small, whereas ECOC are chosen to maximize this distance. We compare one-in-n and ECOC on a multiclass data set using standard MLPs and bagging and arcing voting committees.
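
To make the distance argument concrete, here is a minimal sketch (not from the paper; the 4-class, 7-bit codeword matrix below is a hypothetical example) comparing the minimum pairwise Hamming distance of one-in-n codewords, which is always 2, with that of a spread-out error-correcting code:

```python
import numpy as np

# One-in-n (one-hot) coding: any two codewords differ in exactly 2 bits.
one_in_n = np.eye(4, dtype=int)

# Hypothetical ECOC codeword matrix for 4 classes and 7 bits,
# chosen so that the rows are far apart in Hamming distance.
ecoc = np.array([
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 1],
    [1, 1, 0, 0, 1, 1, 1],
])

def min_pairwise_hamming(codes):
    """Smallest Hamming distance between any two distinct codewords."""
    n = len(codes)
    return min(np.sum(codes[i] != codes[j])
               for i in range(n) for j in range(i + 1, n))

print("one-in-n minimum distance:", min_pairwise_hamming(one_in_n))  # 2
print("ECOC minimum distance:", min_pairwise_hamming(ecoc))          # 4
```

A larger minimum distance means more individual bit classifiers must err before a prediction is decoded to the wrong class, which is why ECOC target codes can help the voting committees discussed above.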


Similar resources

Combining Neural Network Voting Classifiers and Error Correcting Output Codes

Papers published in this report series are preliminary versions of journal articles and not for quotations. Abstract: We show that error correcting output codes (ECOC) can further improve the effects of error dependent adaptive resampling methods such as arc-lh. In traditional one-in-n coding, the distance between two binary class labels is rather small, whereas ECOC are chosen to maximize this di...


Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets

Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, many combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a combining a...


Extending Local Learners with Error-correcting Output Codes

Error-correcting output codes (ECOCs) represent classes with a set of output bits, where each bit encodes a binary classification task corresponding to a unique partition of the classes. Algorithms that use ECOCs learn the function corresponding to each bit, and combine them to generate class predictions. ECOCs can reduce both variance and bias errors for multiclass classification tasks when the ...

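As a rough sketch of that bit-wise scheme (assumptions: the codeword matrix CODES is hypothetical, the helpers train_ecoc/predict_ecoc are invented names, and off-the-shelf logistic regressions stand in for the neural network or nearest neighbor base learners used in the papers above):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical codeword matrix: rows = classes, columns = binary tasks.
CODES = np.array([
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 1],
    [1, 1, 0, 0, 1, 1, 1],
])

def train_ecoc(X, y, codes=CODES):
    """Fit one binary classifier per code bit (per column of the codeword matrix)."""
    return [LogisticRegression(max_iter=1000).fit(X, codes[y, b])
            for b in range(codes.shape[1])]

def predict_ecoc(models, X, codes=CODES):
    """Predict each bit, then pick the class whose codeword is nearest in Hamming distance."""
    bits = np.column_stack([m.predict(X) for m in models])        # (n_samples, n_bits)
    dists = (bits[:, None, :] != codes[None, :, :]).sum(axis=2)   # distance to each codeword
    return dists.argmin(axis=1)

# Tiny synthetic usage example: four shifted Gaussian clusters.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) + np.repeat(np.arange(4), 50)[:, None]
y = np.repeat(np.arange(4), 50)
models = train_ecoc(X, y)
print("training accuracy:", (predict_ecoc(models, X) == y).mean())
```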

Nearest neighbor classification from multiple feature subsets

Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging, Boosting, or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, these combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a...


Applying Multiple Complementary Neural Networks to Solve Multiclass Classification Problem

In this paper, a multiclass classification problem is solved using multiple complementary neural networks. Two techniques, one-against-all and error correcting output codes, are applied to the multiple complementary neural networks. We evaluate the proposed techniques on an extremely imbalanced data set, glass, from the UCI machine learning repository. It is found that the combinati...





Publication date: 1997